A View of Functional Programming
Functional programming? As the name implies, it means "programming with
functions". Of course that is almost a no-content answer; however, one
should ask how often one "programs with functions" in "traditional
programming". Unfortunately most languages view functionality as a poor
relation (no pun intended), basing most programming techniques on
operations that relate closely to those found on traditional computing
engines --the assignment statement, the conditional jump, the dreaded
goto-- that is, on what are typically called "Von Neumann machines".
Unfortunately the constructs that occur "naturally" on these computing
devices are not very clean, either mathematically or practically. Ignoring
the mathematics for the moment, these constructs are much too low-level to
act as acceptable structuring devices for complex processes; to compound
the misfortune, most languages simply sugar-coated the base and vile
machine, removing a few of the more ragged edges from tender fingers.
These languages include Fortran, Algol, Pascal, Basic, COBOL, and the
FDA-approved ADA --what Backus calls the Von Neumann languages.
Another branch of languages began in a "top-down" fashion, viewing
programming languages as a computational notation, per se, leaving the
task of execution to the capable province of machine architecture or
software simulation. These languages include APL, LISP, logic
programming, data flow, Smalltalk, and most recently the FP-related
languages of Backus. All of these languages share a common thread: they
began as notations, driven by concerns for expressing computational ideas,
with concerns for execution and implementation taking a secondary role.
The interesting --though perhaps not surprising-- result of investigating
these languages is that a major portion of their power and elegance is due
to their explicit or implicit treatment of "function". That is, the
mathematical elegance is inherited by the practical language. It's not
surprising, considering that many centuries of study and effort have gone
into refining the concepts of mathematics.
So one is driven back to the idea of function as a basis of computational
notation, though one must be careful since "function" as an abstraction is
computation-independent. Recall that a function is simply a set of ordered
pairs; one may think of many algorithms that compute the "same" function.
What does it mean to compute "as if" one was computing with functions?
What is necessary computationally to describe a functional unit? What
"useful" (when measured against traditional approaches) computing can be
done with such baroque languages? Universality arguments --in the sense
of whether one language can compute something another cannot-- are not
considered "useful" here, by the way. Any non-toy language is universal;
we're talking practical expressibility.
Let's begin by addressing the issue of "functionality" from a
computational point of view. Given a functional equation like:
F(x,y) = 2x + 4y,
high school algebra tells us that F(3,4) is 22. The rules we used for
computing that value can be classified as:
substitution rules:
substitute 3 for x and 4 for y throughout the definition of F, giving
F(3,4) = 2*3 + 4*4,
where we have to write an explicit multiplication 2*3, not 23.
Now we apply:
simplification rules:
replace 2*3 by its "simpler" form, 6, giving
F(3,4) = 6 + 4*4; simplification now allows us to replace 4*4 by 16, giving
F(3,4) = 6 + 16; and simplification finally allows us to replace 6+16 by 22.
Besides these simplification rules, we have implicitly applied rules for
equality; for example, that if α=β and β=∂, then α=∂.
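The same computation can be written in the LISP-ish functional notation used
later in these notes; a minimal sketch, assuming DE defines a function and
PLUS and TIMES are the primitives carrying the simplification rules for + and *:

(DE F (X Y) (PLUS (TIMES 2 X) (TIMES 4 Y)))

(F 3 4)
= (PLUS (TIMES 2 3) (TIMES 4 4))     ; substitution
= (PLUS 6 16)                        ; simplification
= 22                                 ; simplification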
It can be shown that computation as we know it is nothing but the
appropriate collection of substitution and simplification rules, coupled
with applications of laws for equality. In its most rarified form, this
results in the study of a logical system called the "lambda calculus".
This system is expressed as a collection of axioms and rules much like one
finds for elementary geometry or formal logic. This informal notion of
"computation" finds expression in this system as the notion of "proof";
that is, if one can form a proof for a statement in the calculus, then one
can intuit the proof steps as a "computation" of that statement.
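For example, here is a small sketch (reading 2x+4 as informal shorthand for the
corresponding lambda-calculus term): applying the term (λx. 2x+4) to 3 gives

(λx. 2x+4) 3
= 2*3 + 4        by the substitution (beta) rule
= 6 + 4          by simplification
= 10             by simplification

Each line follows from the previous one by a rule of the system, so the sequence
is at once a proof that (λx. 2x+4) 3 = 10 and a computation of the value 10.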
Much like functionality abstracts to the idea of a set of ordered pairs
without requiring a computational rule, so too axiomatic systems do not
specify an order in which rules are to be applied. As long as a rule is
applicable, it may be applied.
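For instance, in the computation of F(3,4) above either multiplication may be
simplified first, and both orders are legitimate:

F(3,4) = 2*3 + 4*4 = 6 + 4*4 = 6 + 16 = 22
F(3,4) = 2*3 + 4*4 = 2*3 + 16 = 6 + 16 = 22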
Two of the traditional strategies for "function evaluation" are easily
expressed in this simple system of rules. Call-by-value corresponds to the
strategy of always simplifying arguments before applying a substitution
rule.
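For example, evaluating F(2*3, 1+1) call-by-value, the arguments are simplified
before being substituted:

F(2*3, 1+1) = F(6, 2) = 2*6 + 4*2 = 12 + 8 = 20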
Call-by-name corresponds to the scheme of substitution before
simplification.
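Evaluating the same expression call-by-name, the unsimplified arguments are
substituted into the body first:

F(2*3, 1+1) = 2*(2*3) + 4*(1+1) = 2*6 + 4*2 = 12 + 8 = 20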
In many cases, the order of rule application is immaterial; the same
"computation" will result.
But not always; the classic hack involves a "delta", a term whose
simplification never terminates.
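A sketch in the LISP-ish notation used later in these notes (LOOP and SECOND
are illustrative names): let LOOP play the role of delta and let SECOND simply
return its second argument.

(DE LOOP (N) (LOOP (ADD1 N)))      ; a delta: simplification of (LOOP 0) never terminates
(DE SECOND (X Y) Y)                ; ignores its first argument

(SECOND (LOOP 0) 3)

Call-by-name substitutes first and returns 3 without ever touching (LOOP 0);
call-by-value insists on simplifying (LOOP 0) first and so never produces a
value at all.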
An important "non-functionality" of computing languages involves the
order-dependent conditional expression:
if p then a else b, which we shall write as if(p,a,b).
Such constructs are commonly defined such that one first evaluates p and,
depending on that value, either evaluates a or b, but not both.
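For example, in

if(eq(x,0), 1, 100/x)

one first evaluates eq(x,0); when x is 0 the value of the whole expression is 1,
and the division 100/0 is never attempted.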
Now our simple scheme of things becomes troubled; what about

if(p, a, a)?

Should the value be the value of a? What if p is the dreaded delta? Under the
evaluation rule just given, if(delta, a, a) has no value at all, even though
both branches agree; settling such questions requires more rules.
Unfortunately, or perhaps fortunately, computation is not trivial to
describe and delimit. However, one can make a general statement about the
phenomenon: in the most general setting one can view computation as the
application of well-defined rules --deduction-- plus a strategy for the
application of those rules --control information.
Compare this with Kowalski's slogan "algorithm = logic + control" and with
Wirth's "algorithms + data structures = programs".
Let this esoteric business of "what is a function" settle for a while (we
will return to it) and let's see what one can do with functions as "data
objects".
We see "data objects" in all languages; Fortran, Pascal, and LISP have
numbers as data objects. The meaning is clear: we can compute with them as
values. What does it mean to compute with functions as objects? Well,
just as in the Fortran-Pascal case, where one has functions that operate
on data (numbers), we should have functions that operate on data (now
functions). Functions that operate on functions are called "functionals"
or sometimes, "operators".
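A tiny sketch of a functional, in the LISP-ish notation used later in these
notes (TWICE is an illustrative name; TIMES is an assumed primitive): TWICE
takes a function and returns a new function that applies it two times. Note
that the returned function must remember FN, a point we return to below.

(DE TWICE (FN) (LAMBDA (X) (FN (FN X))))

((TWICE (LAMBDA (N) (TIMES N N))) 3)    ; square, then square again: 81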
I personally discovered functionality in computing languages with the
Culler-Fried system in the early 1960's. The data items in this language
were finite segments of functions. The foresight of this system was
amazing.
Function as data: as entity, Culler; as abstraction, LISP.
Representability of functions in the language: sequences or trees.
With that we jumped on functions with a vengeance, using a simple
"functional" language that was able to operate on "functional"
representations.
The simplicity of the functional language --a LISP subset-- was used to
discuss the novelty of allowing functions as passive data objects; this
balanced a simple, almost trivial language against a novel and elegant
idea (traditional and most other functional formalisms do not allow
functions as data items).
Given the novelty of functions as data structures, one should be able to
demonstrate significant applications if this novelty has substance.
Note: it is a characteristic of VN languages; maybe it's another out-dated
hold-over from the Von Neumann era of the "stored program machine". (By
the way, the major hallmark of Von Neumann machines is, to me, the "stored
program concept", not the particular languages that we stuff into them.)
Enter the EVAL/APPLY pair, which shows an elegant and useful application of
the representation: one can succinctly describe the evaluation
(implementation) of the language IN THE LANGUAGE ITSELF. It's more than
"cute"; it's a fundamental result in computer science. A first version of
the pair --without funargs-- is sketched below.
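A minimal sketch, using the same style and selector names (IS-CONST, LOOKUP,
MK-ENV, and so on) as the full listing at the end of these notes; here a LAMBDA
is applied by evaluating its body in an environment built on top of the
environment of the call, so free variables are not yet treated properly:

(DE EVAL (X ENV)
  (COND ((IS-CONST X) (VAL X))
        ((IS-VAR X) (LOOKUP X ENV))
        ((IS-IF X) (IF (EVAL (PREM X) ENV)
                       (EVAL (CONSE X) ENV)
                       (EVAL (ANTE X) ENV)))
        (T (APPLY (FUN X) (EVLIS (ARGS X) ENV) ENV))))

(DE APPLY (FN EARGS ENV)
  (COND ((IS-PRIM FN) (DOIT FN EARGS))
        ((IS-LAM FN) (EVAL (BODY FN)
                           (MK-ENV (FORM FN) EARGS ENV)))))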
And now we have a "first-order" understanding of functionality.
We now let the "data objects" become "active", in the sense that they can
appear in the "function" position of an application. These things occurred
in the context of "mapping" functions, aka functionals, aka operators
--that is, functions that take functions as arguments (an example is
sketched below). Now things get more interesting.
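A minimal sketch of such a mapping function, in the dialect of the listing at
the end of these notes (FIRST, REST, CONCAT, NULL), assuming TIMES and LIST as
primitives; MAPFIRST is an illustrative name, and the call (FN (FIRST L)) works
because the toy EVAL evaluates the function position of an application:

(DE MAPFIRST (FN L)
  (IF (NULL L)
      ( )
      (CONCAT (FN (FIRST L)) (MAPFIRST FN (REST L)))))

(MAPFIRST (LAMBDA (N) (TIMES N N)) (LIST 1 2 3))    ; => (1 4 9)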
Building on your understanding of the EVAL/APPLY pair, we investigated the
"true meaning" of what it means to "compute with functions".
(The EVAL/APPLY pair with funargs appears in the listing at the end of
these notes: there EVAL turns a LAMBDA into a FUNARG object packaging the
text with the current environment, and APPLY evaluates the body of a
FUNARG in an environment built on the one packaged inside it.)
This showed that functionality is captured only by including the "free
variables" of a function as part-and-parcel of the function. That is, a
function is the text plus the environment. This result is interesting for
two reasons. First, it solves a problem that plagued (and still plagues)
traditional languages that tried to implement functional objects (note: a
language (e.g. Algol) can have functional objects without having a data
structure representation for them --Occam's razor says it's a bad idea
though). More generally, however, this idea of "capturing state" with a
functional object extends beyond the "free variable" problem; it gives us
an elegant expression of the ideas behind the bullshit of "modularity" and
"message passing": first-class functional objects.
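A small sketch of the free-variable problem (MAKE-ADDER is an illustrative
name): the returned function mentions N free, so it behaves correctly only if
it carries along the environment in which N was bound --exactly what the FUNARG
objects above provide.

(DE MAKE-ADDER (N) (LAMBDA (X) (PLUS X N)))

((MAKE-ADDER 3) 4)    ; => 7, but only if the LAMBDA remembers that N was 3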
One can view "message passing" as an application of functional programming
where the target of the message is a constructed functional object that
contains local state information plus a table of the messages to which the
target can respond.
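A small sketch of such an object (MAKE-COUNTER is an illustrative name; the
dialect is the toy language above, extended with QUOTE, EQ, and ADD1): the
local state is the value of COUNT captured in the FUNARG, and the COND is the
table of messages.

(DE MAKE-COUNTER (COUNT)
  (LAMBDA (MSG)
    (COND ((EQ MSG (QUOTE VALUE)) COUNT)
          ((EQ MSG (QUOTE BUMP)) (MAKE-COUNTER (ADD1 COUNT)))
          (T (QUOTE UNKNOWN-MESSAGE)))))

((MAKE-COUNTER 3) (QUOTE VALUE))                    ; => 3
(((MAKE-COUNTER 3) (QUOTE BUMP)) (QUOTE VALUE))     ; => 4

The message table need not live inside the object; in a real LISP it could just
as well be kept on a property list (PUTPROP/GET), which leads directly to the
data-driven style below.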
One can view object-oriented programming --where the action to be taken is
a function of the class to which the object belongs-- as an instance of
functional programming in which the activities are functions associated,
within the class that describes the instance, with the names of those
activities. In LISP this style is called data-driven programming.
That is, an object is an instance of a class; to perform an activity on an
instance, examine its class and invoke the appropriate function defined
for that activity.
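A sketch of the data-driven style, assuming property lists (PUTPROP/GET) and
the convention --as in the toy language above-- that the function position of
an application is evaluated; CLASS-OF, WIDTH, and HEIGHT are illustrative
selectors:

(PUTPROP (QUOTE RECTANGLE)                            ; the AREA activity for class RECTANGLE
         (LAMBDA (R) (TIMES (WIDTH R) (HEIGHT R)))
         (QUOTE AREA))

(DE PERFORM (ACTIVITY OBJ)                            ; examine the class, invoke the function
  ((GET (CLASS-OF OBJ) ACTIVITY) OBJ))                ;  defined for that activity

(PERFORM (QUOTE AREA) R1)                             ; R1 assumed to be an instance of RECTANGLE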
Indeed message passing and object-oriented/data-driven programming are two
sides of the same coin; and both are views of how to apply functionality
in programming.
One can also examine Iverson's views of Operators and Backus' study of RED
languages and FP-related systems as applications of functionality
(assignment: read Backus Turing Lecture; write Insert and ApplyToAll as
mapping functions).
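One possible rendering of those two functionals, in the dialect used above
(APPLY-TO-ALL and INSERT here are LISP-style transcriptions, not Backus'
own notation):

(DE APPLY-TO-ALL (FN L)                ; apply FN to every element of L
  (IF (NULL L)
      ( )
      (CONCAT (FN (FIRST L)) (APPLY-TO-ALL FN (REST L)))))

(DE INSERT (FN L)                      ; fold FN through L, right to left
  (IF (NULL (REST L))
      (FIRST L)
      (FN (FIRST L) (INSERT FN (REST L)))))

(INSERT (LAMBDA (X Y) (PLUS X Y)) (LIST 1 2 3 4))       ; => 10
(APPLY-TO-ALL (LAMBDA (N) (TIMES N N)) (LIST 1 2 3))    ; => (1 4 9)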
To summarize: functional programming means taking "function" seriously as
the central computational notion --functions as data objects, functions as
arguments, functions that carry their environments with them. EVAL/APPLY,
the mapping functionals, message passing, and data-driven programming are
all applications of that one idea.
Bibliography
Proceedings of the 1980 LISP Conference, Stanford, August 1980.
aip
J. Allen, Anatomy of LISP, McGraw-Hill, 1978.
Winston
bankruptcy
J. Backus, "Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs" (Turing Award Lecture), CACM 21, 8 (1978).
K. Iverson, "Notation as a Tool of Thought" (Turing Award Lecture), CACM 23, 8 (1980).
K. Iverson, "Operators", ACM TOPLAS 1, 2 (1979).
; EVAL/APPLY with funargs: the interpreter referenced above, written in the
; language it describes.  A FUNARG packages a LAMBDA expression with the
; environment current at the time the LAMBDA was evaluated.

(DE EVAL (X ENV)
  (COND ((IS-CONST X) (VAL X))                 ; constants denote their values
        ((IS-VAR X) (LOOKUP X ENV))            ; variables are looked up in the environment
        ((IS-IF X) (IF (EVAL (PREM X) ENV)     ; conditionals: evaluate the test first,
                       (EVAL (CONSE X) ENV)    ;  then only the selected branch
                       (EVAL (ANTE X) ENV)))
        ((IS-LAM X) (MK-FUNARG X ENV))         ; a LAMBDA becomes a FUNARG: text plus environment
        (T (APPLY (EVAL (FUN X) ENV)           ; applications: evaluate the function position
                  (EVLIS (ARGS X) ENV)         ;  and the arguments,
                  ENV))))                      ;  then apply

(DE APPLY (FN EARGS ENV)
  (COND ((IS-PRIM FN) (DOIT FN EARGS))         ; primitives are done directly
        ((IS-FUNARG FN) (EVAL (BODY FN)        ; a FUNARG's body is evaluated in an environment
                              (MK-ENV (FORM FN)       ;  binding its formals to the evaluated
                                      EARGS           ;  arguments, built on the environment
                                      (ENVIR FN)))))) ;  packaged in the FUNARG

(DE EVLIS (L ENV)                              ; evaluate a list of arguments
  (IF (NULL L)
      ( )
      (CONCAT (EVAL (FIRST L) ENV)
              (EVLIS (REST L) ENV))))

(DE LOOKUP (N ST)                              ; search the environment, frame by frame
  (COND ((NULL ST) (ERROR))
        (T (LOOKUP1 N (NAMES (FIRST ST)) (VALUES (FIRST ST)) ST))))

(DE LOOKUP1 (N NAMES VALUES ST)                ; search one frame's parallel name/value lists
  (COND ((MT-Y NAMES) (LOOKUP N (REST ST)))    ; frame exhausted: try the rest of the environment
        ((EQ N (GET-N NAMES)) (GET-V VALUES))
        (T (LOOKUP1 N (TAIL NAMES) (TAIL VALUES) ST))))